Search for: All records

Creators/Authors contains: "Cho, In Ho"

Note: When clicking a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full-text articles may not yet be available free of charge during the embargo period.

Some links on this page may take you to non-federal websites. Their policies may differ from those of this site.

  1. Multifunctional nanosurfaces are receiving growing attention because of their versatile properties. Capillary force lithography (CFL) has emerged as a simple and economical method for fabricating these surfaces. In recent works, the authors proposed leveraging evolution strategies (ES) to modify nanosurface characteristics with CFL and achieve specific functionalities such as frictional, optical, and bactericidal properties. For artificial intelligence (AI)-driven inverse design, earlier research integrates basic multiphysics principles such as dynamic viscosity, air diffusivity, surface tension, and electric potential with backward deep learning (DL) within the framework of ES. As a successful alternative to reinforcement learning, ES performed well for AI-driven inverse design. However, the computational limitations of ES pose a critical technical challenge to achieving fast and efficient design. This paper addresses these challenges by proposing a parallel-computing-based ES (named parallel ES; a minimal sketch of the idea appears after this list). The parallel ES demonstrated the desired speed and scalability, accelerating the AI-driven inverse design of multifunctional nanopatterned surfaces. Detailed parallel ES algorithms and cost models are presented, showing its potential as a promising tool for advancing AI-driven nanomanufacturing. 
    Free, publicly-accessible full text available January 1, 2026
  2. Nanopatterned tribocharge can be generated on the surface of elastomers through their replica molding with nanotextured molds. Despite its vast application potential, the physical conditions enabling the phenomenon have not been clarified in the framework of analytical mechanics. Here, we explain the final tribocharge pattern by separately applying two models, namely cohesive zone failure and cumulative fracture energy, as a function of the mold nanotexture’s aspect ratio. These models deepen our understanding of the triboelectrification phenomenon. 
  3. Predicting individual large earthquakes' (EQs') locations, magnitudes, and timing remains out of reach. The author's prior study shows that individual large EQs have unique signatures obtained from multi-layered data transformations. Via spatio-temporal convolutions, decades-long EQ catalog data are transformed into pseudo-physics quantities (e.g., energy, power, vorticity, and Laplacian), which turn into surface-like information via Gauss curvatures. Using these new features, a rule-learning machine learning approach unravels promising prediction rules. This paper suggests a further data transformation via the Fourier transform (FT). Results show that the new FT-based features can help sharpen the prediction rules (a simplified sketch of the feature pipeline appears after this list). Feasibility tests on large EQs ($$M \ge 6.5$$) over the past 40 years in the western U.S. show promise, shedding light on data-driven prediction of individual large EQs. The handshake among ML methods, Fourier, and Gauss may help answer the long-standing enigma of seismogenesis. 
  4. The scientific community has been looking for novel approaches to develop nanostructures inspired by nature. However, due to the complicated processes involved, controlling the height of these nanostructures is challenging. Nanoscale capillary force lithography (CFL) is one way to use a photopolymer and alter its properties by exposing it to ultraviolet (UV) radiation. Nonetheless, the working mechanism of CFL is not fully understood owing to a lack of sufficient information and first principles. One of these obscure behaviors is the sudden jump phenomenon: the abrupt change in the height of the photopolymer depending on the UV exposure time and the height of the nano-grating (based on experimental data). This paper uses known physical principles alongside artificial intelligence to uncover the unknown physical principles responsible for the sudden jump phenomenon. The results show promise in identifying air diffusivity, dynamic viscosity, surface tension, and electric potential as the previously unknown physical principles that collectively explain the sudden jump phenomenon. 
  5. Nature finds ways to leverage nanotextures to achieve desired functions. Recent advances in nanotechnology endow nanotextures with fascinating multi-functionalities by modulating the nanopixel’s height. But nanoscale height control is a daunting task involving chemical and/or physical processes. As a facile, cost-effective, and potentially scalable remedy, nanoscale capillary force lithography (CFL) has received notable attention. The key enabler is optical pre-modification of the photopolymer’s characteristics via ultraviolet (UV) exposure. Still, the underlying physics of nanoscale CFL is not well understood, and unexplained phenomena such as the “forbidden gap” in the nano capillary rise (an unreachable height range) abound. Due to the small length scales, the lack of large data, and the absence of first principles, direct adoption of machine learning or analytical approaches has been difficult. This paper proposes a hybrid intelligence approach in which artificial and human intelligence work together coherently to unravel the hidden rules from small data. Our results show promising performance in identifying transparent, physics-retained rules of air diffusivity, dynamic viscosity, and surface tension, which collectively appear to explain the forbidden gap in nanoscale CFL. This paper promotes synergistic collaborations of humans and AI for advancing nanotechnology and beyond. 
  6. Machine learning (ML) advancements hinge upon data, the vital ingredient for training. Statistically curing missing data is called imputation, and there are many imputation theories and tools. But they often require difficult statistical and/or discipline-specific assumptions, and general tools capable of curing large data are lacking. Fractional hot deck imputation (FHDI) can cure data by filling nonresponses with observed values (thus, hot deck) without resorting to such assumptions; a toy illustration of the hot-deck idea appears after this list. This review paper summarizes how FHDI evolved into an ultra-data-oriented parallel version (UP-FHDI). Here, ultra data have concurrently large instances (big-n) and high dimensionality (big-p). The evolution is made possible by specialized parallelism and a fast variance estimation technique. Validations with scientific and engineering data confirm that UP-FHDI can cure ultra data (p > 10,000 and n > 1 million), and the cured data sets can improve the prediction accuracy of subsequent ML. The evolved FHDI will help promote reliable ML with cured big data. 
  7. Statistical descriptions of earthquakes offer important probabilistic information, and newly emerging technologies of high-precision observation and machine learning collectively advance our knowledge of complex earthquake behaviors. Still, there remains a formidable knowledge gap in predicting individual large earthquakes' locations and magnitudes. Here, this study shows that individual large earthquakes may have unique signatures that can be represented by new high-dimensional features: Gauss curvature-based coordinates. In particular, the observed earthquake catalog data are transformed into a number of pseudo-physics quantities (i.e., energy, power, vorticity, and Laplacian), which turn into smooth surface-like information via spatio-temporal convolution, giving rise to the new high-dimensional coordinates (a finite-difference sketch of the Gauss curvature step appears after this list). Validations with 40 years of earthquakes in the western U.S. show that the new coordinates appear to hold uniqueness for individual large earthquakes ($$M_w \ge 7.0$$), and the pseudo-physics quantities help identify a customized data-driven prediction model. A Bayesian evolutionary algorithm in conjunction with flexible bases can identify a data-driven model, demonstrating promising reproduction of an individual large earthquake's location and magnitude. Results imply that an individual large earthquake can be distinguished and remembered while its best-so-far model is customized by machine learning. This study paves a new way toward data-driven, automated evolution of individual earthquake prediction. 
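Below is a minimal sketch, in Python, of the parallel evolution strategy (ES) idea referenced in record 1: a population of perturbed design vectors is evaluated concurrently, and the design estimate is updated from fitness-weighted perturbations. The objective function, population size, learning rate, and worker count are illustrative assumptions only, not the paper's multiphysics/DL surrogate or its actual parallel implementation.

```python
# Hypothetical sketch of a parallel-computing-based evolution strategy (ES).
import numpy as np
from multiprocessing import Pool

def fitness(theta):
    # Hypothetical objective: negative squared distance from a target design vector.
    target = np.linspace(0.0, 1.0, theta.size)
    return -np.sum((theta - target) ** 2)

def parallel_es(dim=8, pop_size=32, sigma=0.1, lr=0.05, generations=50, workers=4):
    rng = np.random.default_rng(0)
    theta = rng.normal(size=dim)                 # current design estimate
    with Pool(workers) as pool:
        for _ in range(generations):
            eps = rng.normal(size=(pop_size, dim))            # random perturbations
            candidates = [theta + sigma * e for e in eps]
            scores = np.array(pool.map(fitness, candidates))  # parallel evaluation
            z = (scores - scores.mean()) / (scores.std() + 1e-8)
            theta = theta + lr / (pop_size * sigma) * eps.T @ z  # fitness-weighted step
    return theta

if __name__ == "__main__":
    print(parallel_es())
```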
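The next sketch illustrates, under simplifying assumptions, the kind of Fourier-transform feature extraction described in record 3: a gridded pseudo-physics field (here a synthetic stand-in for a catalog-derived energy grid) is smoothed by a simple convolution and summarized by its low-frequency Fourier magnitudes. The grid size, uniform smoothing kernel, and number of retained modes are hypothetical choices, not the paper's actual pipeline.

```python
# Hypothetical FT-based feature extraction from a gridded pseudo-physics quantity.
import numpy as np
from numpy.fft import fft2, fftshift
from scipy.ndimage import uniform_filter

def fourier_features(field, kernel=5, n_modes=8):
    """Return low-frequency |FFT| values of a smoothed 2-D field as a feature vector."""
    smoothed = uniform_filter(field, size=kernel)   # simplified spatio-temporal smoothing
    spectrum = fftshift(np.abs(fft2(smoothed)))     # centered magnitude spectrum
    c0, c1 = spectrum.shape[0] // 2, spectrum.shape[1] // 2
    half = n_modes // 2
    return spectrum[c0 - half:c0 + half, c1 - half:c1 + half].ravel()

# Synthetic data standing in for a gridded earthquake-catalog quantity.
rng = np.random.default_rng(1)
grid = rng.random((64, 64))
print(fourier_features(grid).shape)   # (64,) low-frequency feature vector
```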
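As a toy illustration of the hot-deck principle behind FHDI/UP-FHDI in record 6, the sketch below fills each missing cell with a randomly drawn observed value from the same column. The real FHDI assigns multiple fractionally weighted donors per nonresponse and adds variance estimation and specialized parallelism; none of that is reproduced here.

```python
# Toy hot-deck imputation: nonresponses are filled with observed donor values.
import numpy as np

def simple_hot_deck(data, seed=0):
    """Fill NaNs in each column with values drawn from that column's observed entries.
    Assumes every column has at least one observed value."""
    rng = np.random.default_rng(seed)
    filled = data.copy()
    for j in range(data.shape[1]):
        col = data[:, j]
        donors = col[~np.isnan(col)]            # observed values act as donors
        missing = np.isnan(col)
        filled[missing, j] = rng.choice(donors, size=missing.sum())
    return filled

X = np.array([[1.0, np.nan], [2.0, 5.0], [np.nan, 6.0], [4.0, 7.0]])
print(simple_hot_deck(X))
```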
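Finally, a short sketch of the Gauss-curvature step mentioned in records 3 and 7: a smooth gridded field is treated as a surface z = f(x, y), and its Gaussian curvature K = (f_xx f_yy - f_xy^2) / (1 + f_x^2 + f_y^2)^2 is computed pointwise by finite differences. The synthetic bump and grid spacing are placeholders for the convolved catalog quantities used in the papers.

```python
# Gaussian curvature of a gridded surface z = f(x, y) via finite differences.
import numpy as np

def gaussian_curvature(f, dx=1.0, dy=1.0):
    fy, fx = np.gradient(f, dy, dx)        # first derivatives (axis 0 = y, axis 1 = x)
    fyy, fyx = np.gradient(fy, dy, dx)     # second derivatives of f_y
    fxy, fxx = np.gradient(fx, dy, dx)     # second derivatives of f_x
    return (fxx * fyy - fxy ** 2) / (1.0 + fx ** 2 + fy ** 2) ** 2

# Synthetic smooth bump standing in for a convolved pseudo-physics field.
x = np.linspace(-2, 2, 50)
X, Y = np.meshgrid(x, x)
Z = np.exp(-(X ** 2 + Y ** 2))
print(gaussian_curvature(Z, dx=x[1] - x[0], dy=x[1] - x[0]).max())
```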